
Normal Distribution

Definition:

For $\mu \in \mathbb{R}$ and $\sigma > 0$, we call a distribution with the density function

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \qquad x \in \mathbb{R},$$

the normal distribution $N(\mu, \sigma^2)$.

For $\mu = 0$ and $\sigma^2 = 1$, this reduces to the density

$$\phi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$

of the standard normal distribution $N(0, 1)$.
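As a quick sanity check of the formula, here is a minimal Python sketch (assuming NumPy and SciPy are available; the helper `normal_pdf` and the test values $\mu = 1$, $\sigma = 2$ are only illustrative) comparing it with SciPy's built-in density:

```python
import numpy as np
from scipy.stats import norm

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) exactly as written in the definition above."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Compare with SciPy's reference implementation at a few points
# (mu = 1, sigma = 2 are arbitrary test values).
x = np.linspace(-5.0, 7.0, 25)
print(np.allclose(normal_pdf(x, mu=1.0, sigma=2.0), norm.pdf(x, loc=1.0, scale=2.0)))  # True
```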


Theorem:

For $\mu \in \mathbb{R}$ and $\sigma > 0$, the function

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

is indeed a density function, i.e., $f \geq 0$ and $\int_{-\infty}^{\infty} f(x)\,dx = 1$.


Proof: Using the substitution $y = \frac{x - \mu}{\sigma}$, so that $dx = \sigma \, dy$, we have:

$$\int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-y^2/2}\,dy = \frac{I}{\sqrt{2\pi}},$$

where

$$I := \int_{-\infty}^{\infty} e^{-y^2/2}\,dy.$$
Note that this integral exists, since $0 < e^{-y^2/2} \leq e^{1/2}\, e^{-|y|}$ and the bound is integrable. Consider:

$$I^2 = \left(\int_{-\infty}^{\infty} e^{-x^2/2}\,dx\right)\left(\int_{-\infty}^{\infty} e^{-y^2/2}\,dy\right) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-\frac{x^2+y^2}{2}}\,dx\,dy.$$
We switch to polar coordinates, i.e., we substitute

$$x = r\cos\theta, \qquad y = r\sin\theta, \qquad r \in [0, \infty),\ \theta \in [0, 2\pi).$$
We compute the Jacobian determinant:

$$\frac{\partial(x, y)}{\partial(r, \theta)} = \det\begin{pmatrix} \cos\theta & -r\sin\theta \\ \sin\theta & r\cos\theta \end{pmatrix} = r\cos^2\theta + r\sin^2\theta = r.$$
So we obtain:

$$I^2 = \int_0^{2\pi}\int_0^{\infty} e^{-r^2/2}\, r\,dr\,d\theta.$$
Substitute $u = \frac{r^2}{2}$, so that $du = r \, dr$:

$$I^2 = \int_0^{2\pi}\int_0^{\infty} e^{-u}\,du\,d\theta = \int_0^{2\pi} 1\,d\theta = 2\pi.$$
Hence, $I = \sqrt{2\pi}$, and we conclude:

$$\int_{-\infty}^{\infty} f(x)\,dx = \frac{1}{\sqrt{2\pi}} \cdot \sqrt{2\pi} = 1. \qquad \square$$
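The conclusion can also be checked numerically; here is a small sketch using SciPy's `quad` (the test values $\mu = 1$, $\sigma = 2$ are arbitrary):

```python
import numpy as np
from scipy.integrate import quad

# Gaussian integral I = integral of exp(-y^2/2) over the real line; the proof says it equals sqrt(2*pi).
I, _ = quad(lambda y: np.exp(-y ** 2 / 2), -np.inf, np.inf)
print(I, np.sqrt(2 * np.pi))  # both approx. 2.5066

# The N(mu, sigma^2) density integrates to 1 (mu = 1, sigma = 2 are arbitrary test values).
mu, sigma = 1.0, 2.0
total, _ = quad(lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi)),
                -np.inf, np.inf)
print(total)  # approx. 1.0
```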

Theorem: Let $X \sim N(\mu,\sigma^2)$ be a random variable. Then

$$\mathbb{E}[X] = \mu \qquad \text{and} \qquad \mathrm{Var}(X) = \sigma^2.$$

Proof:

Expectation:

$$\mathbb{E}[X] = \int_{-\infty}^{\infty} x\,\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx.$$

Substitute $z = \frac{x - \mu}{\sigma} \Rightarrow x = \sigma z + \mu$, and $dx = \sigma dz$:

$$\mathbb{E}[X] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} (\sigma z + \mu)\, e^{-z^2/2}\,dz = \frac{\sigma}{\sqrt{2\pi}} \int_{-\infty}^{\infty} z\, e^{-z^2/2}\,dz + \frac{\mu}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-z^2/2}\,dz.$$
First notice that $z \cdot e^{-z^2/2}$ is an odd function. Since

$$\int_0^{\infty} z\, e^{-z^2/2}\,dz = \left[-e^{-z^2/2}\right]_0^{\infty} = 1 < \infty,$$

we have

$$\int_{-\infty}^{\infty} z\, e^{-z^2/2}\,dz = 0.$$
(Do keep in mind: if $f$ is an odd function, the convergence

$$\int_0^{\infty} f(x)\,dx < \infty$$

is a necessary condition for concluding $\int_{-\infty}^{\infty} f(x)\,dx = 0$; otherwise $f(x) = x$ is a simple counterexample: its symmetric partial integrals vanish, yet the improper integral does not exist.)

Moreover, we have seen in the last proof that

$$\int_{-\infty}^{\infty} e^{-z^2/2}\,dz = \sqrt{2\pi}.$$

Therefore,

$$\mathbb{E}[X] = \frac{\sigma}{\sqrt{2\pi}} \cdot 0 + \frac{\mu}{\sqrt{2\pi}} \cdot \sqrt{2\pi} = \mu.$$
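A quick numerical check of this part (a sketch assuming SciPy's `quad`; $\mu = 1$, $\sigma = 2$ are arbitrary test values):

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.0, 2.0  # arbitrary test values

def pdf(x):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# E[X] = integral of x * f(x) over the real line; should come out as mu.
mean, _ = quad(lambda x: x * pdf(x), -np.inf, np.inf)
print(mean)  # approx. 1.0
```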

Variance:

We compute:

$$\mathrm{Var}(X) = \mathbb{E}\!\left[(X-\mu)^2\right] = \int_{-\infty}^{\infty} (x-\mu)^2\,\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx.$$

Again substitute $z = \frac{x - \mu}{\sigma} \Rightarrow x = \sigma z + \mu$, and $dx = \sigma dz$:

$$\mathrm{Var}(X) = \frac{\sigma^2}{\sqrt{2\pi}} \int_{-\infty}^{\infty} z^2\, e^{-z^2/2}\,dz = \sigma^2 \int_{-\infty}^{\infty} z^2\, \phi(z)\,dz,$$

where

$$\phi(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$$

is the density of the standard normal distribution.
In addition, since $\phi'(z) = -z\phi(z)$, integration by parts gives

$$\int_{-\infty}^{\infty} z^2\, \phi(z)\,dz = -\int_{-\infty}^{\infty} z\, \phi'(z)\,dz = \left[-z\,\phi(z)\right]_{-\infty}^{\infty} + \int_{-\infty}^{\infty} \phi(z)\,dz = 0 + 1 = 1.$$

Finally, we have

$$\mathrm{Var}(X) = \sigma^2 \cdot 1 = \sigma^2. \qquad \square$$
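Both moments can be verified numerically; a small sketch (assuming NumPy and SciPy; $\mu = 1$, $\sigma = 2$ are arbitrary test values) using quadrature and a Monte Carlo cross-check:

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.0, 2.0  # arbitrary test values

def pdf(x):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Var(X) = integral of (x - mu)^2 * f(x); should come out as sigma^2 = 4.
var, _ = quad(lambda x: (x - mu) ** 2 * pdf(x), -np.inf, np.inf)
print(var)  # approx. 4.0

# Monte Carlo cross-check: sample X ~ N(mu, sigma^2) and inspect the empirical mean and variance.
rng = np.random.default_rng(0)
samples = rng.normal(loc=mu, scale=sigma, size=1_000_000)
print(samples.mean(), samples.var())  # approx. 1.0 and approx. 4.0
```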